1197. Minimum Knight Moves
Medium

In an infinite chess board with coordinates from -infinity to +infinity, you have a knight at square [0, 0].

A knight has 8 possible moves it can make, as illustrated below. Each move is two squares in a cardinal direction, then one square in an orthogonal direction.

Return the minimum number of steps needed to move the knight to the square [x, y].  It is guaranteed the answer exists.

 

Example 1:

Input: x = 2, y = 1
Output: 1
Explanation: [0, 0] → [2, 1]

Example 2:

Input: x = 5, y = 5
Output: 4
Explanation: [0, 0] → [2, 1] → [4, 2] → [3, 4] → [5, 5]

 

Constraints:

  • |x| + |y| <= 300
Hint 1: You can simulate the movements since the limits are low.

Hint 2: Is there a search algorithm applicable to this problem?

Hint 3: Since we want the minimum number of moves, we can use Breadth-First Search.


Solution


Intuition

We are asked to find the minimum steps required to travel from one point to another point on a grid. One of the most intuitive ideas that one might come up with during an interview is brute force enumeration, i.e. we explore all possible paths between the two points and return the minimum number of moves.

It is indeed a valid solution. However, it is worth giving the idea a second thought before jumping into the implementation. The question resembles a classical graph search problem: finding the shortest path between two nodes in a graph. For weighted graphs, the classical solution is Dijkstra's algorithm; for an unweighted graph like this one, where every move has the same cost, the strategy of BFS (Breadth-First Search) suffices.

With BFS, as opposed to DFS (Depth-First Search), we make sure that before exploring further towards the destination, all the immediate neighbors are properly explored.

The key to solving this problem is based on the BFS strategy. The idea is that starting from the origin, we explore the neighborhood following the order that is determined by the distance to the origin, i.e. we first explore all the points within a single step from the origin, then we explore all the points that can be reached with two steps, so on and so forth.

During the exploration process, as soon as we reach the target point, we then can call the current path the shortest path, since our exploration follows the order of distance.

One can imagine the whole process as if we send a sound wave to determine the distance to an unknown object. The sound wave propagates in all directions with the same speed, while its scope grows as a circle. Once the circle reaches the target object, the radius is the shortest distance between the origin and the target object.

Approach 1: BFS

Algorithm

As a reminder, BFS is a pattern for a collection of algorithms, rather than a specific algorithm. Additionally, given a specific BFS algorithm, there are several sub-patterns in terms of how the algorithm is implemented.

Despite the differences, all BFS algorithms share the usage of two important data structures: queue and set (or map). A queue is used to maintain the order in which places are visited, while a set/map is used to mark which places have already been visited.

For reference, we provide a pattern to implement BFS algorithms in our Queue and BFS Explore Card. Following the strategy of BFS, here we provide a sample algorithm to solve the problem with the following steps:

  • First, we create a queue data structure to store the places to be visited during the next step and a set data structure named visited to keep track of all the places that we have visited so far.

  • The process of BFS consists of a loop that spins around the queue. The loop ends either when we reach the target or when the queue is empty.

  • Within the main loop, we use a nested loop to iterate over the elements currently in the queue. All of these elements are at the same distance from the starting point.

  • Within the nested loop, we prepare the elements that will be visited during the next step.

Note: In Java, the HashSet is not the most efficient data structure. For this reason, using a HashSet to keep track of the visited cells in the Java implementation will result in a TLE (Time Limit Exceeded) error.

To avoid the TLE, we use a bitmap (i.e. a two-dimensional array of boolean values) instead of a HashSet. The range of the array is set according to the constraint of the input (i.e. |x| + |y| <= 300).
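Concretely, the steps above can be sketched in Python as follows. This is a sketch rather than a definitive implementation: the function name, the exact padding of the bitmap, and the restriction of the search to a slightly padded first quadrant (coordinates down to -2) are our own choices, made to respect the constraint |x| + |y| <= 300; the plain algorithm would explore all four quadrants.

```python
from collections import deque

def min_knight_moves(x: int, y: int) -> int:
    """BFS layer by layer from the origin until we land on (x, y)."""
    offsets = [(1, 2), (2, 1), (2, -1), (1, -2),
               (-1, -2), (-2, -1), (-2, 1), (-1, 2)]
    # By symmetry, fold the target into the first quadrant.
    x, y = abs(x), abs(y)
    # Bitmap instead of a hash set; indices are shifted by 2 so that
    # the small negative detours (down to -2) fit into the array.
    visited = [[False] * 310 for _ in range(310)]
    visited[2][2] = True
    queue = deque([(0, 0)])
    steps = 0
    while queue:
        for _ in range(len(queue)):  # one pass over the queue = one BFS layer
            cx, cy = queue.popleft()
            if (cx, cy) == (x, y):
                return steps
            for dx, dy in offsets:
                nx, ny = cx + dx, cy + dy
                # Allow a small margin below zero; wider detours are not
                # needed for first-quadrant targets.
                if -2 <= nx <= 307 and -2 <= ny <= 307 and not visited[nx + 2][ny + 2]:
                    visited[nx + 2][ny + 2] = True
                    queue.append((nx, ny))
        steps += 1
    return -1  # unreachable under the given constraints
```

The layer-by-layer inner loop makes the `steps` counter track the distance of the current layer, so the first time we pop the target we can return immediately.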

Complexity Analysis

Given the coordinate of the target as (x, y), the number of cells covered by the circle that is centered at point (0, 0) and reaches the target point is roughly (max(|x|, |y|))^2.

  • Time Complexity: O((max(|x|, |y|))^2)

    • Due to the nature of BFS, before reaching the target, we will have covered all the neighborhoods that are closer to the start point. The aggregate of these neighborhoods forms a circle, and the area can be approximated by the area of a square with an edge length of max(2|x|, 2|y|). The number of cells within this square would be (max(2|x|, 2|y|))^2.

    • Hence, the overall time complexity of the algorithm is O((max(2|x|, 2|y|))^2) = O((max(|x|, |y|))^2).

  • Space Complexity: O((max(|x|, |y|))^2)

    • We employed two data structures in the algorithm, i.e. a queue and a set.

    • At any given moment, the queue contains elements that are situated in at most two different layers (or levels). In our case, the maximum number of elements at one layer would be 4 · max(|x|, |y|), i.e. the perimeter of the exploration square. As a result, the space complexity for the queue is O(max(|x|, |y|)).

    • As for the set, it will contain every element that we visited, which is (max(|x|, |y|))^2 as we estimated in the time complexity analysis. As a result, the space complexity for the set is O((max(|x|, |y|))^2).

    • To sum up, the overall space complexity of the algorithm is O((max(|x|, |y|))^2), which is dominated by the space used by the set.


Approach 2: Bidirectional BFS

Intuition

Based on the above idea of BFS, one optimization that we can apply is to perform bidirectional exploration instead of unidirectional exploration.

The reason why the bidirectional BFS is an optimized solution is illustrated in the following graph:

[figure: bidirectional BFS exploration scopes]

Intuitively, as we can see from the above graph, the area of the orange circles that we explore with bidirectional BFS is much smaller than the area of the red circle that we would explore with unidirectional BFS (half the size, to be exact).

This can be proved mathematically. Suppose that the distance between the start and target points is d. The exploration scope covered by the unidirectional BFS will be a circle whose area is π · d^2. On the other hand, with the bidirectional BFS, the exploration scope will be two smaller circles whose total area is 2 · π · (d/2)^2 = π · d^2 / 2, i.e. half of the area covered by the unidirectional BFS.

Algorithm

To implement the bidirectional BFS algorithm, we will double the usage of the data structures in the unidirectional BFS. Additionally, we need to make the following adaptations:

  • Instead of using the set data structure to keep track of the visited places, we use the map data structure, which contains not only the information of visited places but also the distance between each place and the starting point.

  • Instead of only storing the coordinates of the next places to be visited in the queue, we also store the distance between each place and the starting point. This way we don't need an extra variable to keep track of distance.

With the two adaptations listed above, we can make the implementation more concise and clear. Here are some sample implementations.
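The two adaptations above can be sketched in Python (rather than Java) as follows. The names, the layer-synchronized expansion, and the padded first-quadrant bound are our own implementation choices, not the only valid ones; the key points from the text are that the maps record distances as well as visited status, and that the search stops once the two explorations meet.

```python
from collections import deque

def min_knight_moves_bidir(x: int, y: int) -> int:
    """Bidirectional BFS: expand one full layer of the smaller frontier
    at a time, and stop once the two explorations meet."""
    offsets = [(1, 2), (2, 1), (2, -1), (1, -2),
               (-1, -2), (-2, -1), (-2, 1), (-1, 2)]
    x, y = abs(x), abs(y)
    if (x, y) == (0, 0):
        return 0
    # Maps double as visited sets: cell -> distance from that side's start.
    dist_origin = {(0, 0): 0}
    dist_target = {(x, y): 0}
    front_origin = deque([(0, 0)])
    front_target = deque([(x, y)])

    def advance(frontier, dist, other):
        """Expand one full layer; return the best total path if the sides meet."""
        best = None
        for _ in range(len(frontier)):
            cx, cy = frontier.popleft()
            for dx, dy in offsets:
                nxt = (cx + dx, cy + dy)
                # Keep the search inside a padded first quadrant.
                if not (-2 <= nxt[0] <= 302 and -2 <= nxt[1] <= 302):
                    continue
                if nxt in dist:
                    continue
                dist[nxt] = dist[(cx, cy)] + 1
                if nxt in other:  # the two frontiers have met
                    total = dist[nxt] + other[nxt]
                    best = total if best is None else min(best, total)
                frontier.append(nxt)
        return best

    while True:
        # Always expand the smaller frontier first.
        if len(front_origin) <= len(front_target):
            found = advance(front_origin, dist_origin, dist_target)
        else:
            found = advance(front_target, dist_target, dist_origin)
        if found is not None:
            return found
```

Expanding full layers and taking the minimum meeting sum over each layer keeps the result correct even when several cells of the same layer touch the opposite frontier.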

Note: in theory, the above implementation of bidirectional BFS should be faster than the unidirectional BFS. However, in reality, this is not the case for the Java implementation, due to heavy usage of sophisticated data structures, which are inefficient compared to simple arrays.

In addition to the bidirectional exploration optimization, there is also a technique called pruning that has been mentioned in some posts.

Pruning means to remove the unwanted parts, and that is exactly what this technique does. It focuses only on the directions that might eventually lead to the discovery of the target while ignoring other directions thereby reducing our total search space.

Indeed, it can improve our performance. However, it can be tricky to ensure the correctness of the algorithm. If we accidentally prune a valid branch, we might fail to obtain the correct answer in the end. Under the circumstance of passing an interview in a short time frame, one must weigh the risks associated with pruning against the potential gain.

Complexity Analysis

Although the bidirectional BFS cuts the exploration scope in half, compared to the unidirectional BFS, the overall time and space complexities remain the same. We will break it down in detail in this section.

First of all, given the target's coordinate, (x, y), the area that is covered by the two exploratory circles of the bidirectional BFS will be (max(|x|, |y|))^2 / 2.

  • Time Complexity: O((max(|x|, |y|))^2)

    • Reducing the scope of exploration by half does speed up the algorithm. However, it does not change the time complexity of the algorithm, which remains O((max(|x|, |y|))^2).

  • Space Complexity: O((max(|x|, |y|))^2)

    • In exchange for reducing the search scope, we double the usage of data structures compared to the unidirectional BFS. Similarly to the time complexity, multiplying the required space by two does not change the overall space complexity of the algorithm, which remains O((max(|x|, |y|))^2).


Approach 3: DFS (Depth-First Search) with Memoization

Intuition

As opposed to BFS, DFS (Depth-First Search) is a pattern of exploration algorithms that prioritizes depth over breadth during the exploration process.

It is not surprising that problems that can be solved with BFS can also be solved with DFS. However, in most scenarios one method is more efficient than the other. Indeed, this is also the case for this problem.

Symmetry of Solutions

Before explaining how we can apply the DFS algorithm to this problem, we should address the symmetry of the answers, which we haven't touched on so far.

We claim that the target (x, y) and its horizontally, vertically, and diagonally symmetric points (i.e. (x, -y), (-x, y), and (-x, -y)) share the same answer as the target point.

Due to the symmetry of the board (i.e. from -infinity to +infinity) and the symmetry of the allowed movements, we can rest assured that the above claim is correct, even without a rigorous mathematical proof.

Based on the above insight, we can focus on the first quadrant of the coordinate plane, where both x and y are positive. Any target that is outside of the first quadrant can be shifted to its symmetric point in the first quadrant by taking the absolute value of each coordinate, i.e. (|x|, |y|).

At the beginning of the DFS as well as during the process of DFS, we will always shift the exploration to the first quadrant.

Reduced Directions

Now that we have contained the exploration within the first quadrant, we can further focus on which directions we employ. For a target that is situated in the first quadrant, though technically we could move in 8 different directions, there are only two directions (i.e. left-down and down-left) that will move us closer to the origin.

Indeed, before we reach the immediate neighborhood of the origin, we only need to explore the two left-down directions (with offsets of (-1, -2) and (-2, -1)), since the rest of the directions would deviate further away from the origin.

The immediate neighborhood of the origin refers to the points where the sum of coordinates is less than or equal to 2, i.e. x + y <= 2. In order to reach an immediate neighbor point from the origin, we need to do a bit of zigzag. In the following graph, we show examples of how to reach some of the immediate neighbors via zigzag steps.

[figure: DFS base cases]

As it turns out, any immediate neighbor with x + y == 2 takes exactly 2 steps to reach when starting from the origin. One can exhaustively verify this insight.

Algorithm

With the above insights in mind, we can begin to work on our DFS algorithm.

Rather than starting from the origin, we start from the target and walk backwards to reach the origin. Also, instead of exploring all 8 directions, we only need to explore the two left-down directions as we discussed before.

Assume that the function dfs(x, y) returns the minimum steps required to reach the target point (x, y), the idea of DFS can be expressed in the following formula:

dfs(x, y) = min(dfs(|x - 2|, |y - 1|), dfs(|x - 1|, |y - 2|)) + 1

The formula can be interpreted as such: at each step of the backward exploration process, by only exploring the left-down directions we can obtain the shortest path.

As one might notice, the above function is a recursive function. And it is critical to define the base cases to make the definition sound. There are in general two base cases:

  • Case 1): x = 0, y = 0. When we reach the origin, no further steps are required to reach our goal, i.e. dfs(x, y) = 0.

  • Case 2): x + y = 2. When we are at an immediate neighbor as discussed before, it takes two more steps to reach our goal, i.e. dfs(x, y) = 2.

Note: one might argue that there is another base case to cover, which is x + y = 1, e.g. x=1, y=0. This is not our base case though, because by taking one more step further, it will be reduced down to our base case 2), i.e. |x-1| + |y-2| = 2.

Given the above definitions, one can intuitively implement them with a recursive function. Additionally, it is important to apply the memoization technique to prevent duplicate calculations from occurring during the recursive process.

The above form of recursion with memoization is also known as Top-Down Dynamic Programming, where we work out the solutions from top to down (base cases), and we reuse the intermediate results (with memoization) to speed up the calculation.

It is also feasible to start from the origin and move towards the target. Accordingly, we should adapt the conditions in the base cases.
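Putting the recursive formula and the two base cases together, a compact top-down sketch might look like this (a sketch under our own naming; `functools.lru_cache` plays the role of the memoization table):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def dfs(x: int, y: int) -> int:
    """Minimum knight moves between the origin and (x, y), with x, y >= 0."""
    if x + y == 0:   # base case 1: we are at the origin
        return 0
    if x + y == 2:   # base case 2: (1, 1), (2, 0), (0, 2) each take 2 steps
        return 2
    # Walk backwards from the target using only the two left-down
    # offsets; abs() reflects any overshoot back into the first quadrant.
    return min(dfs(abs(x - 1), abs(y - 2)),
               dfs(abs(x - 2), abs(y - 1))) + 1

def min_knight_moves_dfs(x: int, y: int) -> int:
    # Symmetry: fold the target into the first quadrant first.
    return dfs(abs(x), abs(y))
```

For example, `min_knight_moves_dfs(1, 0)` recurses into `dfs(0, 2)` and `dfs(1, 1)`, both of which are base case 2, giving the answer 3.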

Complexity Analysis

Let (x, y) be the coordinate of the target.

  • Time Complexity: O(|x · y|)

    • The execution of our recursive algorithm will unfold as a binary tree where each node represents an invocation of the recursive function. And the time complexity of the algorithm is proportional to the total number of invocations, i.e. total number of nodes in the binary tree.

    • The total number of nodes grows exponentially in the binary tree. However, there will be some overlap in terms of the invocations, i.e. dfs(x,y) might be invoked multiple times with the same input. Thanks to the memoization technique, we avoid redundant calculations, i.e. the return value of dfs(x,y) is stored for reuse later, which greatly improves the performance.

    • In the algorithm, we restrict the exploration to the first quadrant of the board. Therefore, in the worst case, we will explore all of the cells between the origin and the target in the first quadrant. In total, there are |x · y| cells in the rectangle that spans from the origin to the target. As a result, the overall time complexity of the algorithm is O(|x · y|).

  • Space Complexity: O(|x · y|)

    • First of all, due to the presence of recursion in the algorithm, it will incur additional memory consumption in the function call stack. The consumption is proportional to the depth of the execution tree, i.e. max(|x|, |y|).

    • Secondly, due to the application of the memoization technique, we keep all the intermediate results in memory for reuse. As we have seen in the above time complexity analysis, the maximum number of intermediate results will be O(|x · y|).

    • To sum up, the overall space complexity of the algorithm is O(|x · y|), which is dominated by the memoization.


Follow-up

For math lovers, as it turns out, we can derive a mathematical formula based on the DFS solution, which solves the problem in O(1) time.

For more details, check out the post by poorvank in the discussion forum.

The solution is definitely brilliant. However, it goes without saying that it is unlikely that one would be expected to come up with such a solution during an interview. It could be equally difficult to defend and explain the solution to the interviewer.


Comments
kishorekumar21:

Hands down, this is one of the best solution articles on LeetCode. Whoever wrote this deserves a pay raise. Thanks a lot.

lyharrietbui:

I don't quite understand why we use 605 for boolean[][] visited = new boolean[605][605]; based on |x| + |y| <= 300.

pavan_nihal:

I got TLE using BFS with a HashMap. I came to see the solution and found out that an array should be used instead. While this is a nice idea, I feel that a Medium problem should not have such a tight time constraint that using a HashMap instead of an array will cause TLE.

yi64:

Approaches 2 & 3 try to narrow down the search space. Actually, we can use the A* algorithm to achieve that goal. Here's my Python3 solution applying the A* algorithm:

    # requires: from heapq import heappop, heappush
    def minKnightMoves(self, x: int, y: int) -> int:
        def heuristic(xx, yy):
            return (abs(xx-x) + abs(yy-y)) // 3
        directs = [(-2,+1), (-1,+2), (+1,+2), (+2,+1), (+2,-1), (+1,-2), (-1,-2), (-2,-1)]
        frontier = [(0, 0, 0)]
        stepSoFar = {(0, 0): 0}
        while frontier:
            priority, currX, currY = heappop(frontier)
            currStep = stepSoFar[(currX, currY)]
            if currX == x and currY == y:
                return currStep
            for dx, dy in directs:
                nextX, nextY = currX+dx, currY+dy
                nextStep = currStep + 1
                if (nextX, nextY) not in stepSoFar or nextStep < stepSoFar[(nextX, nextY)]:
                    stepSoFar[(nextX, nextY)] = nextStep
                    priority = nextStep + heuristic(nextX, nextY)
                    heappush(frontier, (priority, nextX, nextY))

panicking_kernel:

In Approach #3, how is it guaranteed that the knight will end up on the base-case squares (1, 1), (0, 2), (2, 0) and will not miss them?

patrickop:

By making some observations about the symmetry and special cases (and no maths at all), we can achieve a linear solution. With recursion, this is very simple to explain in an interview with some drawings, and beats 100% (time) and 97% (memory). The recursion depth would be O(x+y), which is not too bad.

    int minKnightMoves(int x, int y) {
        // We work from the target location back to the origin (knight moves are reversible)
        // base case, we are at the origin
        if (x == 0 and y == 0) return 0;  
        // observe: Solutions are symmetrical. No matter which quadrant of the coordinate system we are in,
        // the steps to get there are the same amount
        if (x < 0 or y < 0) return minKnightMoves(abs(x), abs(y));
        // observe: The steps to get to (x,y) are the same amount as to get to (y,x)
        if (y > x) return minKnightMoves(y,x);
        // Now observe: in all but 4 of the remaining cases, the optimal jump is left 2 down 1
        // Here, we must go two left and 1 UP
        if (x == 3 and y == 1) return 1 + minKnightMoves(x-2,y+1);
        if (x == 4 and y == 3) return 1 + minKnightMoves(x-2,y+1);
        if (x == 1 and y == 1) return 1 + minKnightMoves(x-2,y+1);
        // Here, we must go two UP and 1 RIGHT
        if (x == 2 and y == 0) return 1 + minKnightMoves(x-1,y+2);
        return 1+minKnightMoves(x-2, y-1);
    }

nagakiran1:

Would someone tell me how (0, 1) takes 3 steps? :-|

akki06:

I think even BFS can achieve the time complexity mentioned for the DFS approach, because that optimization is more about using the symmetry and moving in the right direction (rather than the DFS/BFS methodology).

Here is my code. I measured the complexity of my code and it seems to depend on both x and y (not just max(x, y)).

from collections import deque

class Solution:
    def minKnightMoves(self, x: int, y: int) -> int:
        if x == 0 and y == 0:
            return 0
        MOVES = [
            (1, 2), (2, 1),
            (-1, 2), (2, -1),
            (1, -2), (-2, 1),
        ]
        x = abs(x)
        y = abs(y)
        q = deque([((0, 0), 0)])

        visited = set([])
        time_complexity = 0
        while q:
            time_complexity += 1
            coords, moves = q.popleft()
            if coords in visited:
                continue
            visited.add(coords)
            for pos in MOVES:
                time_complexity += 1
                new_pos = (coords[0] + pos[0], coords[1] + pos[1])
                if new_pos in visited:
                    continue
                # If it is far & going in opposite direction, let's chuck it
                nearby = abs(new_pos[0] - x) < 4 and abs(new_pos[1] - y) < 4
                correct_direction = all([
                    coords[0] - 1 <= new_pos[0] <= x + 1,
                    coords[1] - 1 <= new_pos[1] <= y + 1
                ])
                if nearby or correct_direction:
                    if new_pos == (x, y):
                        print('x, y, time_complexity:', x, y, time_complexity)
                        return (moves + 1)
                    else:
                        q.append((new_pos, moves + 1))

